# Large-Parameter Language Models
## ProofGPT-v0.1 6.7B

ProofGPT-v0.1 is a language model based on the GPT-NeoX architecture with 6.7 billion parameters, trained on the proof-pile dataset.

- Organization: hoskinson-center
- License: MIT
- Tags: Large Language Model · Transformers · English
- Downloads: 168 · Likes: 10
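The entry is tagged for the Transformers library, so the checkpoint can presumably be loaded as an ordinary causal language model. The sketch below is a minimal example assuming a Hugging Face Hub repository ID of `hoskinson-center/proofGPT-v0.1-6.7B` (inferred from the organization and model name, not stated in the listing) and uses only the standard `AutoTokenizer`/`AutoModelForCausalLM` API.

```python
# Minimal sketch: loading a GPT-NeoX-based causal LM with Hugging Face Transformers.
# Assumption: the Hub repository ID below is inferred from the listing, not confirmed by it.
from transformers import AutoModelForCausalLM, AutoTokenizer

repo_id = "hoskinson-center/proofGPT-v0.1-6.7B"  # assumed repository ID

tokenizer = AutoTokenizer.from_pretrained(repo_id)
model = AutoModelForCausalLM.from_pretrained(repo_id)

# Greedy completion of a short mathematical prompt, in keeping with the proof-pile training data.
prompt = "Theorem. For every natural number n,"
inputs = tokenizer(prompt, return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=64, do_sample=False)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```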
## Meta Llama 3 8B Instruct

Meta Llama 3 8B Instruct is Meta's 8-billion-parameter instruction-tuned large language model, optimized for dialogue use cases and reported to outperform many open-source chat models on common benchmarks.

- Organization: meta-llama
- Tags: Large Language Model · Transformers · English
- Downloads: 1.2M · Likes: 3,933
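Because this model is instruction-tuned for dialogue, a typical way to try it is through the Transformers `pipeline` API with chat-formatted messages. The sketch below assumes the Hub repository ID `meta-llama/Meta-Llama-3-8B-Instruct` and that access to the gated repository has already been granted; neither detail is stated in the listing itself.

```python
# Minimal sketch: running an instruction-tuned chat model via the text-generation pipeline.
# Assumptions: the Hub repository ID below is inferred from the listing, and you are
# authenticated with access to the gated repository.
import torch
from transformers import pipeline

chat = pipeline(
    "text-generation",
    model="meta-llama/Meta-Llama-3-8B-Instruct",  # assumed repository ID
    torch_dtype=torch.bfloat16,                   # reduces memory for the 8B weights
    device_map="auto",
)

messages = [
    {"role": "system", "content": "You are a concise technical assistant."},
    {"role": "user", "content": "In one sentence, what is an instruction-tuned model?"},
]

# The pipeline applies the model's chat template to the message list before generating.
result = chat(messages, max_new_tokens=128)
print(result[0]["generated_text"][-1]["content"])
```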
## OPT 2.7B

OPT is an open-source large language model series released by Meta AI, with parameter counts ranging from 125 million to 175 billion, aimed at promoting open research on large-scale language models. This entry is the 2.7-billion-parameter variant.

- Organization: facebook
- License: Other
- Tags: Large Language Model · English
- Downloads: 53.87k · Likes: 83